Posterior distribution analysis for Bayesian inference in neural networks

Authors

  • Pavel Myshkov
  • Simon Julier
Abstract

This study explores the posterior predictive distributions obtained with various Bayesian inference methods for neural networks. The quality of the distributions is assessed both visually and quantitatively using Kullback–Leibler (KL) divergence, Kolmogorov–Smirnov (KS) distance and precision–recall scores. We perform the analysis on a synthetic dataset that allows a more detailed examination of the methods, and validate the findings on larger datasets. We find that among the recently proposed techniques, the simpler ones – Stochastic Gradient Langevin Dynamics (SGLD) and MC Dropout – consistently provide good approximations to the “true” posterior while not requiring extensive parameter tuning.
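As an illustration of the quantitative assessment described in the abstract, the sketch below compares two sets of predictive samples with the KS statistic and a histogram-based KL estimate. The samples are synthetic stand-ins (simple Gaussians), not the paper's actual posterior predictives, and the binning choices are assumptions:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# Stand-in predictive samples: a "true" posterior predictive (e.g. from
# long-run MCMC) and an approximation (e.g. from MC Dropout or SGLD).
true_samples = rng.normal(loc=0.0, scale=1.0, size=5000)
approx_samples = rng.normal(loc=0.05, scale=1.1, size=5000)

# Kolmogorov–Smirnov distance: maximum gap between the two empirical CDFs.
ks_stat = ks_2samp(true_samples, approx_samples).statistic

# Crude KL estimate via histogram densities on a shared grid
# (restricted to bins where both densities are positive).
bins = np.linspace(-5.0, 5.0, 101)
p, _ = np.histogram(true_samples, bins=bins, density=True)
q, _ = np.histogram(approx_samples, bins=bins, density=True)
mask = (p > 0) & (q > 0)
width = bins[1] - bins[0]
kl = np.sum(p[mask] * np.log(p[mask] / q[mask])) * width

print("KS distance:", ks_stat, "KL estimate:", kl)
```

A smaller KS distance and KL estimate indicate a closer match between the approximate and reference predictive distributions.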


Related articles

An Introduction to Inference and Learning in Bayesian Networks

Bayesian networks (BNs) are modern tools for modeling phenomena in dynamic and static systems and are used in different subjects such as disease diagnosis, weather forecasting, decision making and clustering. A BN is a graphical-probabilistic model which represents causal relations among random variables and consists of a directed acyclic graph and a set of conditional probabilities. Structure...


Bayesian Estimation of Parameters in the Exponentiated Gumbel Distribution

Abstract: The Exponentiated Gumbel (EG) distribution has been proposed to capture some aspects of the data that the Gumbel distribution fails to specify. In this paper, we estimate the EG distribution's parameters in the Bayesian framework. We consider a 2-level hierarchical structure for the prior distribution. As the posterior distributions do not admit a closed form, we perform approximate inference by using ...


A Bayesian Approach to Online Learning

Online learning is discussed from the viewpoint of Bayesian statistical inference. By replacing the true posterior distribution with a simpler parametric distribution, one can define an online algorithm as a repetition of two steps: an update of the approximate posterior when a new example arrives, and an optimal projection into the parametric family. Choosing this family to be Gaussian, we sh...
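The two-step scheme described above can be illustrated in the simplest conjugate setting, where the Gaussian projection step is exact. This is a hypothetical sketch of an online Bayesian update for a mean with known noise variance, not the cited paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Online Bayesian update for the mean of Gaussian data with known noise
# variance. The prior N(mu, var) is updated one observation at a time;
# since the Gaussian family is conjugate here, no projection error occurs.
noise_var = 1.0
mu, var = 0.0, 10.0  # broad initial prior

true_mean = 2.0
for x in rng.normal(true_mean, np.sqrt(noise_var), size=500):
    # Posterior after one observation, carried forward as the new prior.
    var_new = 1.0 / (1.0 / var + 1.0 / noise_var)
    mu = var_new * (mu / var + x / noise_var)
    var = var_new

print("posterior mean:", mu, "posterior variance:", var)
```

After 500 observations the posterior mean concentrates near the true mean and the posterior variance shrinks roughly as 1/n; a non-conjugate model would require an explicit projection back into the Gaussian family at each step.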


Bayesian Learning via Stochastic Dynamics

The attempt to find a single "optimal" weight vector in conventional network training can lead to overfitting and poor generalization. Bayesian methods avoid this, without the need for a validation set, by averaging the outputs of many networks with weights sampled from the posterior distribution given the training data. This sample can be obtained by simulating a stochastic dynamical system th...
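The posterior-averaging idea described above can be sketched as follows. The weight samples here are drawn from a stand-in Gaussian rather than from simulated stochastic dynamics, and the tiny network is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def net(x, w):
    """Tiny one-hidden-layer network; w packs both weight vectors."""
    w1, w2 = w[:5], w[5:]
    return np.tanh(np.outer(x, w1)) @ w2

# Stand-in for posterior weight samples (in practice these would come
# from simulating the stochastic dynamical system, not from a Gaussian).
weight_samples = rng.normal(0.0, 1.0, size=(200, 10))

x = np.linspace(-2.0, 2.0, 9)
preds = np.stack([net(x, w) for w in weight_samples])

# Bayesian prediction: average over the sampled networks, with the
# spread across samples serving as a predictive uncertainty estimate.
mean_pred = preds.mean(axis=0)
std_pred = preds.std(axis=0)
print(mean_pred.shape, std_pred.shape)
```

Averaging over many sampled weight vectors, instead of committing to a single "optimal" one, is what gives the Bayesian approach its robustness to overfitting.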


Estimation of Products Final Price Using Bayesian Analysis Generalized Poisson Model and Artificial Neural Networks

Estimating the final price of products is of great importance. For manufacturing companies, proposing a final price is only possible after the design process is over. These companies propose an approximate initial price of the required products to the customers, which requires some time and money. Here, using the existing data of already designed transformers and utilizing the Bayesian anal...



Journal:

Volume   Issue 

Pages  -

Published: 2016